31 research outputs found

    Subject-Independent Detection of Movement-Related Cortical Potentials and Classifier Adaptation from Single-Channel EEG

    Detection of Attempted Stroke Hand Motions from Surface EMG

    Classification of hand grasp kinetics and types using movement-related cortical potentials and EEG rhythms

    Detection of single-trial movement intentions from EEG is paramount for brain-computer interfacing in neurorehabilitation. These movement intentions contain task-related information, and if this can be decoded, neurorehabilitation could potentially be optimized. The aim of this study was to classify single-trial movement intentions associated with two levels of force and speed and three different grasp types, using EEG rhythms and components of the movement-related cortical potential (MRCP) as features. Feature importance was used to estimate the encoding of discriminative information. Two data sets were used: 29 healthy subjects executed and imagined different hand movements while EEG was recorded over the contralateral sensorimotor cortex. The following features were extracted: delta, theta, mu/alpha, beta, and gamma rhythms, and the readiness potential, negative slope, and motor potential of the MRCP. After sequential forward selection, classification was performed using linear discriminant analysis and support vector machines. Limited classification accuracies were obtained from the EEG rhythms and MRCP components: 0.48±0.05 (grasp types), 0.41±0.07 (kinetic profiles, motor execution), and 0.39±0.08 (kinetic profiles, motor imagination). Delta activity contributed the most, but all features provided discriminative information. These findings suggest that information from the entire EEG spectrum is needed to discriminate between task-related parameters in single-trial movement intentions.
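
The band-power features this abstract lists (delta through gamma) can be sketched with a simple FFT-based estimator. A minimal illustration on synthetic data; the sampling rate and exact band edges are assumptions, not values taken from the paper:

```python
import numpy as np

FS = 500  # sampling rate in Hz (assumed)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "mu_alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 48)}

def band_powers(trial, fs=FS, bands=BANDS):
    """Return mean spectral power per EEG band for a 1-D single-trial signal."""
    freqs = np.fft.rfftfreq(trial.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(trial)) ** 2 / trial.size
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Synthetic 2 s trial: a 10 Hz (mu/alpha) oscillation plus broadband noise.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
trial = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
feats = band_powers(trial)
```

With this input the mu/alpha band dominates, as expected for a 10 Hz component; in the study such per-band powers would form part of the feature vector fed to the classifier.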

    Implementing Performance Accommodation Mechanisms in Online BCI for Stroke Rehabilitation: A Study on Perceived Control and Frustration

    Brain–computer interfaces (BCIs) are successfully used for stroke rehabilitation, but the training is repetitive, and patients can lose the motivation to train. Moreover, controlling the BCI may be difficult, which causes frustration and leads to even worse control. Due to this frustration and lack of motivation/engagement, patients might not adhere to the regimen. The aim of this study was to implement three performance accommodation mechanisms (PAMs) in an online motor imagery-based BCI to aid users and evaluate their perceived control and frustration. Nineteen healthy participants controlled a fishing game with a BCI in four conditions: (1) no help, (2) augmented success (augment a successful BCI attempt), (3) mitigated failure (turn an unsuccessful BCI attempt into a neutral output), and (4) override input (turn an unsuccessful BCI attempt into a successful output). Each condition was followed up and assessed with Likert-scale questionnaires and a post-experiment interview. Perceived control and frustration were best predicted by the amount of positive feedback the participants received. PAM help increased perceived control for poor BCI users but decreased it for good BCI users. The override-input PAM frustrated users the most, and users differed in how they wanted to be helped. By using PAMs, developers have more freedom to create engaging stroke rehabilitation games.
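
The three PAMs described amount to a small remapping of the decoder's output before it reaches the game. A minimal sketch, assuming string labels for conditions and outcomes (the names are illustrative, not taken from the study's implementation):

```python
def apply_pam(decoded_success: bool, condition: str) -> str:
    """Map a single BCI classification outcome to game feedback
    under one of the performance accommodation mechanisms (PAMs)."""
    if decoded_success:
        # Augmented success exaggerates the reward for a correct attempt.
        return "boosted_success" if condition == "augmented_success" else "success"
    if condition == "mitigated_failure":
        return "neutral"   # unsuccessful attempt becomes a neutral outcome
    if condition == "override_input":
        return "success"   # unsuccessful attempt is overridden to succeed
    return "failure"       # "no_help": feedback mirrors the decoder
```

The study's finding that perceived control tracks the amount of positive feedback corresponds to how often this mapping returns a success-like outcome.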

    Feature- and classification analysis for detection and classification of tongue movements from single-trial pre-movement EEG

    Individuals with severe tetraplegia can benefit from brain-computer interfaces (BCIs). While most movement-related BCI systems focus on right/left hand and/or foot movements, very few studies have considered tongue movements for constructing a multiclass BCI. The aim of this study was to decode four movement directions of the tongue (left, right, up, and down) from single-trial pre-movement EEG and to provide a feature and classifier investigation. In offline analyses (of data from ten individuals without a disability), detection and classification were performed using temporal, spectral, entropy, and template features, classified with linear discriminant analysis, support vector machine, random forest, or multilayer perceptron classifiers. Besides the 4-class classification scenario, all possible 3- and 2-class scenarios were tested to find the most discriminable movement types. Linear discriminant analysis achieved, on average, the highest accuracies for both movement detection and classification. The right and down tongue movements provided the highest and lowest detection accuracies (95.3±4.3% and 91.7±4.8%), respectively. The 4-class classification achieved an accuracy of 62.6±7.2%, while the best 3-class classification (left, right, and up movements) and 2-class classification (left and right movements) achieved accuracies of 75.6±8.4% and 87.7±8.0%, respectively. Using only a combination of the temporal and template feature groups provided further accuracy improvements, presumably because these feature groups capture the movement-related cortical potentials, which differ noticeably between the left and right brain hemispheres for the different movements. This study shows that the cortical representation of the tongue is useful for extracting control signals for multiclass movement-detection BCIs.
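
The best-performing classifier here, linear discriminant analysis, reduces for two classes (e.g. left vs. right) to Fisher's discriminant and can be sketched in a few lines of NumPy. Synthetic Gaussian clusters stand in for the EEG feature vectors, and the small ridge term is an assumption for numerical stability:

```python
import numpy as np

def fit_lda(X0, X1):
    """Two-class LDA: pooled covariance, linear decision boundary."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = len(X0), len(X1)
    pooled = ((n0 - 1) * np.cov(X0, rowvar=False)
              + (n1 - 1) * np.cov(X1, rowvar=False)) / (n0 + n1 - 2)
    # Ridge term keeps the solve stable for near-singular covariances.
    w = np.linalg.solve(pooled + 1e-6 * np.eye(pooled.shape[0]), m1 - m0)
    b = -w @ (m0 + m1) / 2
    return w, b

def predict_lda(w, b, X):
    return (X @ w + b > 0).astype(int)   # 0 = class 0, 1 = class 1

# Synthetic "left vs. right" feature clusters, well separated.
rng = np.random.default_rng(1)
X0 = rng.standard_normal((100, 4))
X1 = rng.standard_normal((100, 4)) + 2.0
w, b = fit_lda(X0, X1)
acc = (predict_lda(w, b, np.vstack([X0, X1]))
       == np.r_[np.zeros(100), np.ones(100)]).mean()
```

Multiclass LDA (for the 3- and 4-class scenarios) generalizes this by comparing per-class discriminant scores rather than a single threshold.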

    Manual 3D Control of an Assistive Robotic Manipulator Using Alpha Rhythms and an Auditory Menu: A Proof-of-Concept

    Brain–Computer Interfaces (BCIs) have been regarded as potential tools for individuals with severe motor disabilities, such as amyotrophic lateral sclerosis, which render movement-reliant interfaces unusable. This study aimed to develop a dependent BCI system for manual end-point control of a robotic arm. A proof-of-concept system was devised using parieto-occipital alpha-wave modulation and a cyclic menu with auditory cues. Users chose a movement to be executed and asynchronously stopped the action when necessary. Tolerance intervals allowed users to cancel or confirm actions. Eight able-bodied subjects used the system to perform a pick-and-place task. To investigate potential learning effects, the experiment was conducted twice over two consecutive days. Subjects obtained satisfactory completion rates (84.0 ± 15.0% and 74.4 ± 34.5% for the first and second day, respectively) and high path efficiency (88.9 ± 11.7% and 92.2 ± 9.6%). Subjects took on average 439.7 ± 203.3 s to complete each task, but the robot was in motion only 10% of the time. There was no significant difference in performance between the two days. The developed control scheme provided users with intuitive control, but a considerable amount of time is spent waiting for the right target (auditory cue). Incorporating other brain signals may increase its speed.
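
The asynchronous confirm/cancel behaviour with tolerance intervals can be illustrated with a simple dwell rule: the alpha-power estimate must stay above threshold for a run of consecutive windows before a menu action is confirmed, and any dip cancels the attempt. The threshold, window count, and function name are illustrative assumptions, not the study's parameters:

```python
def confirm_command(alpha_powers, threshold, hold_windows):
    """Return the index of the window at which a command is confirmed,
    or None if no alpha-power run lasts long enough.
    A dip below threshold before confirmation cancels the attempt."""
    run = 0
    for i, power in enumerate(alpha_powers):
        run = run + 1 if power > threshold else 0
        if run >= hold_windows:
            return i
    return None
```

For example, `confirm_command([0.1, 0.9, 0.9, 0.9], 0.5, 3)` confirms at the fourth window, while a run interrupted by a dip is cancelled. This kind of dwell requirement trades speed for robustness, consistent with the long task times reported above.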

    Induction of Neural Plasticity Using a Low-Cost Open Source Brain-Computer Interface and a 3D-Printed Wrist Exoskeleton

    Brain-computer interfaces (BCIs) have been proven useful for stroke rehabilitation, but a number of factors impede the use of this technology in rehabilitation clinics and at home, chief among them the usability and cost of the BCI system. The aims of this study were to develop a low-cost 3D-printed wrist exoskeleton that can be controlled by a low-cost open-source BCI (OpenViBE), and to determine whether training with such a setup could induce neural plasticity. Eleven healthy volunteers imagined wrist extensions, which were detected from single-trial electroencephalography (EEG); in response, the wrist exoskeleton replicated the intended movement. Motor-evoked potentials (MEPs) elicited using transcranial magnetic stimulation were measured before, immediately after, and 30 min after BCI training with the exoskeleton. The BCI system had a true positive rate of 86 ± 12% with 1.20 ± 0.57 false detections per minute. Compared to the measurement before BCI training, the MEPs increased by 35 ± 60% immediately after and by 67 ± 60% 30 min after the training. There was no association between BCI performance and the induction of plasticity. In conclusion, it is possible to detect imagined movements using an open-source BCI setup and to control a low-cost 3D-printed exoskeleton, and this combination can induce neural plasticity. These findings may promote the availability of BCI technology in rehabilitation clinics and at home. However, the usability must be improved, and further tests with stroke patients are needed.
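
The reported detection metrics (true positive rate and false detections per minute) can be computed by matching detections to movement onsets within a tolerance window. A sketch under assumed conventions; the ±1 s tolerance in the example and the one-detection-per-onset matching rule are assumptions, since the abstract does not state them:

```python
def score_detections(onsets, detections, tol_s, duration_min):
    """True positive rate and false detections per minute.
    A detection within ±tol_s of an as-yet-unmatched movement onset is a
    true positive; every remaining detection counts as a false positive."""
    matched = set()
    tp = 0
    for d in detections:
        hit = next((o for o in onsets
                    if abs(d - o) <= tol_s and o not in matched), None)
        if hit is not None:
            matched.add(hit)
            tp += 1
    fp = len(detections) - tp
    return tp / len(onsets), fp / duration_min

# Three imagined movements at 10 s, 20 s, 30 s; two hits and one stray detection.
tpr, fpm = score_detections([10.0, 20.0, 30.0], [10.5, 19.8, 40.0], 1.0, 2.0)
```

Here two of three onsets are detected (TPR = 2/3) and one spurious detection over two minutes gives 0.5 false detections per minute, mirroring the form of the figures quoted above.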